New parametric measures of information

Authors
Abstract


Related articles

Dynamic Bayesian Information Measures

This paper introduces measures of information for Bayesian analysis when the support of the data distribution is progressively truncated. The focus is on lifetime distributions, where the support is truncated at the current age t ≥ 0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updatings...
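The Shannon entropy and Kullback-Leibler quantities named in this abstract can be sketched numerically. The following minimal Python sketch (the function names and the discretized lifetime pmf are illustrative assumptions, not taken from the paper) shows how truncating the support at an age t and renormalizing changes the entropy:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log), skipping zero cells."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalize defensively
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_divergence(p, q):
    """Kullback-Leibler information K(p || q) = sum p_i log(p_i / q_i)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

# Hypothetical discretized lifetime pmf over ages 0..3.
pmf = np.array([0.4, 0.3, 0.2, 0.1])

# Truncate the support at t = 0 (condition on survival past age 0)
# and renormalize the remaining mass.
truncated = pmf[1:] / pmf[1:].sum()

print(shannon_entropy(pmf), shannon_entropy(truncated))
```

The truncated distribution is the conditional distribution given survival, which is exactly the "progressively truncated support" setting the abstract describes.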


Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, this paper examines measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -Divergence, and so on. Properties and results related to distances between probability d...
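Two of the divergences this abstract names have simple discrete forms: the Hellinger distance H(p, q) = sqrt(½ Σ (√p_i − √q_i)²) and the J-divergence, the symmetrized Kullback-Leibler information J(p, q) = K(p‖q) + K(q‖p) = Σ (p_i − q_i) log(p_i / q_i). A minimal sketch (function names are my own, and the J-divergence here is restricted to the common support; zeros outside it are ignored):

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete pmfs: sqrt(0.5 * ||sqrt(p) - sqrt(q)||^2)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def j_divergence(p, q):
    """Symmetrized KL: J(p, q) = KL(p||q) + KL(q||p) = sum (p - q) * log(p / q),
    evaluated only on the common support in this sketch."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = (p > 0) & (q > 0)
    return np.sum((p[m] - q[m]) * np.log(p[m] / q[m]))
```

Both vanish exactly when p = q, which is the minimal property any separation measure between densities must satisfy.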


New Generalized Parametric Measures of Entropy and Cross Entropy

The measure of entropy introduced by Shannon [12] is a key concept in the literature of information theory and has found tremendous applications in different disciplines of science and technology. Various researchers have generalized this entropy using different approaches. The object of the present manuscript is to develop a generalized measure of entropy by using the property of concavit...
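As a concrete instance of such a one-parameter generalization (the well-known Rényi entropy, shown here for illustration; it is not the measure developed in this manuscript), a sketch might look like:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_a(p) = log(sum_i p_i^a) / (1 - a), for a > 0, a != 1.
    It recovers the Shannon entropy in the limit a -> 1."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalize defensively
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)
```

For a uniform distribution on n points, H_a equals log n for every order a, matching the Shannon value, which is one sanity check for any parametric family of entropies.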


A new outlook of Shannon's information measures

Abstract --Let X_i, i = 1, ..., n, be discrete random variables, and X̃_i be a set variable corresponding to X_i. Define the universal set to be Ω = ∪_i X̃_i and let F be the σ-field generated by {X̃_i, i = 1, ..., n}. It is shown that Shannon's information measures on the random variables X_i, i = 1, ..., n, constitute a unique measure μ* on F, which is called the I-Measure. In other words, the Shannon...
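The set-theoretic correspondence behind the I-Measure can be checked numerically for n = 2: the mutual information is the measure μ* of the "intersection" atom, i.e. I(X;Y) = H(X) + H(Y) − H(X,Y), mirroring |A ∩ B| = |A| + |B| − |A ∪ B|. A minimal Python sketch with a hypothetical 2×2 joint pmf:

```python
import numpy as np

# Hypothetical joint pmf of (X, Y); rows index X, columns index Y.
pxy = np.array([[0.3, 0.2],
                [0.1, 0.4]])

def H(p):
    """Shannon entropy (natural log) of a pmf given as an array of any shape."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

hx  = H(pxy.sum(axis=1))            # H(X), marginal over rows
hy  = H(pxy.sum(axis=0))            # H(Y), marginal over columns
hxy = H(pxy)                        # H(X, Y), joint entropy

mutual_info = hx + hy - hxy         # = mu*(X-tilde intersect Y-tilde)
```

The inclusion-exclusion form makes the identity I(X;Y) ≥ 0 read as "the intersection atom has non-negative measure", which is the intuition the abstract's I-Measure formalizes.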


Non-parametric Information-Theoretic Measures of One-Dimensional Distribution Functions from Continuous Time Series

We study non-parametric measures for the problem of comparing distributions, which arises in anomaly detection for continuous time series. Non-parametric measures take two distributions as input and produce two numbers as output: the difference between the input distributions and the statistical significance of this difference. Some of these measures, such as the Kullback-Leibler measure, are define...
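The "two numbers as output" pattern the abstract describes can be illustrated with a standard non-parametric pair (this is a generic Kolmogorov-Smirnov statistic with a permutation p-value, not the construction from the paper; all names are illustrative):

```python
import numpy as np

def ks_statistic(x, y):
    """The 'difference' number: maximum gap between the two empirical CDFs."""
    x, y = np.sort(x), np.sort(y)
    grid = np.concatenate([x, y])
    cdf_x = np.searchsorted(x, grid, side="right") / len(x)
    cdf_y = np.searchsorted(y, grid, side="right") / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

def permutation_pvalue(x, y, n_perm=999, seed=0):
    """The 'significance' number: fraction of label permutations whose
    statistic is at least as large as the observed one."""
    rng = np.random.default_rng(seed)
    observed = ks_statistic(x, y)
    pooled = np.concatenate([x, y]).astype(float)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if ks_statistic(pooled[:len(x)], pooled[len(x):]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one smoothing avoids p = 0
```

For anomaly detection one would slide a window over the time series, compare the window's samples against a reference window with this pair of numbers, and flag windows whose p-value falls below a threshold.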



Journal

Journal title: Information and Control

Year: 1981

ISSN: 0019-9958

DOI: 10.1016/s0019-9958(81)90263-1